Speeding-Up Convergence via Sequential Subspace Optimization: Current State and Future Directions
Abstract
This is an overview paper written in the style of a research proposal. In recent years we introduced a general framework for large-scale unconstrained optimization, Sequential Subspace Optimization (SESOP), and demonstrated its usefulness for sparsity-based signal/image denoising, deconvolution, compressive sensing, computed tomography, diffraction imaging, and support vector machines. We explored its combination with the Parallel Coordinate Descent and Separable Surrogate Function methods, obtaining state-of-the-art results in the above-mentioned areas. Several methods are faster than plain SESOP under specific conditions: the trust-region Newton method for problems with an easily invertible Hessian matrix; the truncated Newton method when fast multiplication by the Hessian is available; stochastic optimization methods for problems with large stochastic-type data; and multigrid methods for problems with a nested multilevel structure. Each of these methods can be further improved by merging it with SESOP. One can also accelerate the Augmented Lagrangian method for constrained optimization problems and the Alternating Direction Method of Multipliers for problems with a separable objective function and non-separable constraints.
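The abstract describes SESOP only at a high level. As a rough illustration, the following is a minimal Python sketch of the core idea under simplifying assumptions: at each outer iteration the objective is minimized over a low-dimensional subspace spanned by the current gradient and a short memory of previous steps, and the small inner problem is solved here with a generic BFGS call for brevity. All names (`sesop`, `memory`, etc.) and the particular choice of subspace directions are illustrative, not the authors' exact algorithm, which also employs additional directions to obtain worst-case complexity guarantees.

```python
import numpy as np
from scipy.optimize import minimize

def sesop(f, grad, x0, n_iter=200, memory=1, tol=1e-8):
    """SESOP-style loop: repeatedly minimize f over a small subspace
    spanned by the current gradient and up to `memory` previous steps."""
    x = np.asarray(x0, dtype=float)
    steps = []  # previous steps kept as extra subspace directions
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stationarity check
            break
        D = np.column_stack([g] + steps)  # subspace basis, shape (n, 1 + len(steps))
        # Inner problem: alpha* = argmin_alpha f(x + D @ alpha), dimension 1 + len(steps)
        res = minimize(lambda a: f(x + D @ a), np.zeros(D.shape[1]), method="BFGS")
        step = D @ res.x
        x = x + step
        steps = ([step] + steps)[:memory]  # refresh the short step memory
    return x

# Toy usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 20))
A = M.T @ M + 0.1 * np.eye(20)   # symmetric positive definite
b = rng.standard_normal(20)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = sesop(f, grad, np.zeros(20), memory=1)
print(np.linalg.norm(grad(x_star)))  # near zero at the minimizer
```

With a single remembered step the subspace resembles a nonlinear conjugate-gradient direction pair; enlarging `memory` trades a more expensive inner problem for typically fewer outer iterations.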
Similar Resources
A Trust-Region Sequential Quadratic Programming with New Simple Filter as an Efficient and Robust First-Order Reliability Method
Real-world applications involving nonlinear functions of multiple variables can be implicitly assessed through structural reliability analysis. This study establishes an efficient algorithm for solving highly nonlinear structural reliability problems. To this end, a numerical nonlinear optimization algorithm with a new simple filter is first defined to locate and estimate the most ...
Convergence Analysis of Prediction Markets via Randomized Subspace Descent
Prediction markets are economic mechanisms for aggregating information about future events through sequential interactions with traders. The pricing mechanisms in these markets are known to be related to optimization algorithms in machine learning and through these connections we have some understanding of how equilibrium market prices relate to the beliefs of the traders in a market. However, ...
Minimizing a Quadratic Over a Sphere
A new method, the sequential subspace method (SSM), is developed for the problem of minimizing a quadratic over a sphere. In our scheme, the quadratic is minimized over a subspace which is adjusted in successive iterations to ensure convergence to an optimum. When a sequential quadratic programming iterate is included in the subspace, convergence is locally quadratic. Numerical comparisons with...
Speeding up COMPASS for high-dimensional discrete optimization via simulation
The convergent optimization via most promising area stochastic search (COMPASS) algorithm is a locally convergent random-search algorithm for solving discrete optimization via simulation problems. COMPASS has drawn a significant amount of attention since its introduction. While the asymptotic convergence of COMPASS does not depend on the problem dimension, the finite-time performance of the algorithm ...
Sequential Optimality Conditions and Variational Inequalities
In recent years, sequential optimality conditions have frequently been used to prove convergence of iterative methods for nonlinear constrained optimization problems. Sequential optimality conditions do not require any constraint qualifications. In this paper, we present the necessary sequential complementary approximate Karush-Kuhn-Tucker (CAKKT) condition for a point to be a solution of a ...
Journal: CoRR
Volume: abs/1401.0159
Pages: -
Publication date: 2013